Gradient Flow Algorithms for Density Propagation in Stochastic Systems
Authors
Abstract
Similar papers
Extended stochastic gradient identification algorithms for Hammerstein-Wiener ARMAX systems
An extended stochastic gradient algorithm is developed to estimate the parameters of Hammerstein–Wiener ARMAX models. The basic idea is to replace the unmeasurable noise terms in the information vector of the pseudo-linear regression identification model with the corresponding noise estimates, which are computed from the obtained parameter estimates. The obtained parameter estimates of the identif...
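The noise-replacement idea above can be sketched for a toy first-order ARMAX model. This is a minimal illustration of my own construction, not the paper's algorithm for Hammerstein–Wiener systems; the model, parameter values, and normalising gain are all assumptions chosen for the sketch.

```python
import numpy as np

# Hypothetical sketch: extended stochastic gradient (ESG) for a first-order
# ARMAX model  y(t) = a*y(t-1) + b*u(t-1) + v(t) + c*v(t-1).
# The unmeasurable noise term v(t-1) in the regressor is replaced by the
# residual vhat(t-1) computed from the current parameter estimates.
rng = np.random.default_rng(0)
a, b, c = 0.5, 1.0, 0.3          # true parameters (illustrative values)
n = 20000
u = rng.standard_normal(n)
v = 0.1 * rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = a * y[t - 1] + b * u[t - 1] + v[t] + c * v[t - 1]

theta = np.zeros(3)              # estimates of [a, b, c]
vhat = np.zeros(n)               # noise estimates standing in for v(t)
r = 1.0                          # cumulative normalising gain
for t in range(1, n):
    phi = np.array([y[t - 1], u[t - 1], vhat[t - 1]])
    r += phi @ phi
    err = y[t] - phi @ theta
    theta += (phi / r) * err     # normalised stochastic-gradient step
    vhat[t] = err                # residual replaces the unmeasurable noise

print(theta)   # estimates of a and b approach 0.5 and 1.0; c converges more slowly
```

The low-excitation noise regressor makes the estimate of c the slowest to converge, which is typical for these pseudo-linear regression schemes.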
On Stochastic Proximal Gradient Algorithms
We study a perturbed version of the proximal gradient algorithm for which the gradient is not known in closed form and should be approximated. We address the convergence and derive a non-asymptotic bound on the convergence rate for the perturbed proximal gradient, a perturbed averaged version of the proximal gradient algorithm and a perturbed version of the fast iterative shrinkage-thresholding ...
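The perturbed setting can be illustrated on a lasso problem, where the proximal map is soft-thresholding and the smooth gradient is only available up to zero-mean noise. A minimal sketch under assumed problem sizes and noise level, not the paper's analysis:

```python
import numpy as np

# Hypothetical sketch: proximal gradient for the lasso
#   min_x 0.5*||A x - b||^2 + lam*||x||_1,
# where the gradient A^T(Ax - b) is observed only up to zero-mean noise
# (the "perturbed" setting described above).
rng = np.random.default_rng(1)
m, d = 200, 10
A = rng.standard_normal((m, d))
x_true = np.zeros(d)
x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true
lam = 0.1

def soft_threshold(z, tau):
    """Proximal operator of tau*||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

L = np.linalg.norm(A, 2) ** 2    # Lipschitz constant of the smooth part
step = 1.0 / L
x = np.zeros(d)
for k in range(500):
    grad = A.T @ (A @ x - b)
    grad += 0.01 * rng.standard_normal(d)   # gradient perturbation
    x = soft_threshold(x - step * grad, step * lam)

print(np.round(x, 2))   # close to x_true on the support, near zero elsewhere
```

With bounded perturbations and a fixed step, the iterates settle in a small neighbourhood of the lasso solution rather than converging exactly, which is the behaviour the non-asymptotic bounds quantify.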
Convergence Analysis of Gradient Descent Stochastic Algorithms
This paper proves convergence of a sample-path based stochastic gradient-descent algorithm for optimizing expected-value performance measures in discrete event systems. The algorithm uses increasing precision at successive iterations, and it moves against the direction of a generalized gradient of the computed sample performance function. Two convergence results are established: one, for the ca...
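The increasing-precision idea can be sketched on a scalar expected-value objective: the gradient at iteration k is estimated from a growing number of sample paths, so the estimate sharpens as the iterates settle. The objective, sample schedule, and step sizes here are my own illustrative assumptions:

```python
import numpy as np

# Hypothetical sketch: sample-path stochastic gradient descent with
# increasing precision for f(x) = E[(x - Z)^2], Z ~ N(2, 1).
# Iteration k uses N_k = k sample paths to estimate the gradient.
rng = np.random.default_rng(2)
x = 0.0
for k in range(1, 401):
    z = rng.normal(2.0, 1.0, size=k)       # N_k = k sample paths
    grad = np.mean(2.0 * (x - z))          # sample-average gradient
    x -= (1.0 / k) * grad                  # diminishing step size
print(x)   # approaches the minimiser E[Z] = 2
```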
A Variational Analysis of Stochastic Gradient Algorithms
Stochastic Gradient Descent (SGD) is an important algorithm in machine learning. With constant learning rates, it is a stochastic process that, after an initial phase of convergence, generates samples from a stationary distribution. We show that SGD with constant rates can be effectively used as an approximate posterior inference algorithm for probabilistic modeling. Specifically, we show how t...
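The stationary-distribution behaviour is easy to see on a quadratic loss: with a constant learning rate, SGD stops converging and instead bounces around the minimiser, and the post-burn-in iterates look like samples from a stationary distribution. A minimal sketch with assumed loss, noise level, and learning rate:

```python
import numpy as np

# Hypothetical sketch: constant-step-size SGD on a quadratic loss. After a
# burn-in phase the iterates no longer converge but fluctuate around the
# optimum, i.e. they behave like draws from a stationary distribution whose
# mean sits at the minimiser.
rng = np.random.default_rng(3)
theta_star = 1.5
eta = 0.1                     # constant learning rate
x = 0.0
samples = []
for t in range(5000):
    grad = (x - theta_star) + 0.5 * rng.standard_normal()  # noisy gradient
    x -= eta * grad
    if t >= 1000:             # discard burn-in
        samples.append(x)
samples = np.array(samples)
print(samples.mean(), samples.std())  # mean near 1.5, spread stays positive
```

Shrinking the learning rate shrinks the stationary spread, which is the knob the variational view tunes to match a target posterior.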
Intrinsic Geometry of Stochastic Gradient Descent Algorithms
We consider the intrinsic geometry of stochastic gradient descent (SG) algorithms. We show how to derive SG algorithms that fully respect an underlying geometry which can be induced by either prior knowledge in the form of a preferential structure or a generative model via the Fisher information metric. We show that using the geometrically motivated update and the “correct” loss function, the i...
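The Fisher-metric update can be sketched in the simplest case, estimating the mean of a Gaussian with known variance: the Fisher information for the mean is 1/sigma^2, so the natural-gradient step rescales the ordinary stochastic gradient by sigma^2. The model and step schedule are assumptions for illustration, not the paper's general construction:

```python
import numpy as np

# Hypothetical sketch: stochastic natural gradient for the mean of a
# Gaussian with known variance sigma^2. Preconditioning the ordinary
# gradient by the inverse Fisher information (sigma^2) makes the update
# invariant to how the likelihood is parameterised.
rng = np.random.default_rng(4)
sigma2 = 4.0
data = rng.normal(3.0, np.sqrt(sigma2), size=2000)
mu = 0.0
for t, y in enumerate(data, start=1):
    grad = (mu - y) / sigma2          # gradient of the negative log-likelihood
    nat_grad = sigma2 * grad          # precondition by inverse Fisher info
    mu -= (1.0 / t) * nat_grad        # Robbins-Monro step
print(mu)   # tracks the running sample mean of the data
```

In this special case the natural-gradient recursion with 1/t steps reduces exactly to the running sample mean, which is the "correct" estimator the geometric view recovers.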
Journal
Journal title: IEEE Transactions on Automatic Control
Year: 2020
ISSN: 0018-9286, 1558-2523, 2334-3303
DOI: 10.1109/tac.2019.2951348